| Name | Version | Summary | Date |
|------|---------|---------|------|
| tensorzero | 2025.10.8 | The Python client for TensorZero | 2025-10-30 20:48:28 |
| lamb-abm | 0.1.0 | A unified framework for building agent-based models with Large Language Model integration | 2025-10-28 01:23:11 |
| lmurg | 1.0.4 | LLM-Based Neural Network Generator | 2025-10-27 20:29:42 |
| nn-gpt | 1.0.4 | LLM-Based Neural Network Generator | 2025-10-27 20:19:58 |
| openai-http-proxy | 2.0.0 | "OpenAI HTTP Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:47 |
| oai-proxy | 2.0.0 | "OAI Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:45 |
| lm-proxy-server | 2.0.0 | "LM Proxy Server" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:43 |
| lm-proxy | 2.0.0 | "LM-Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:42 |
| llm-proxy-server | 2.0.0 | "LLM Proxy Server" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:40 |
| inference-proxy | 2.0.0 | "Inference Proxy" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:38 |
| ai-proxy-server | 2.0.0 | "AI Proxy Server" is an OpenAI-compatible HTTP proxy server for running inference on various LLMs, capable of working with the Google, Anthropic, and OpenAI APIs, local PyTorch inference, etc. | 2025-10-26 17:36:36 |
| llm4time | 0.5.0 | A package for time series forecasting using language models. | 2025-10-25 20:04:07 |
| toponymy | 0.4.0 | A library for using large language models to name topics | 2025-10-09 21:20:47 |
| nestful-wrapper | 0.1.7 | A Python wrapper over NESTFUL data | 2025-10-09 04:30:06 |
| ai-microcore | 4.4.3 | Minimalistic foundation for AI applications | 2025-10-08 13:48:20 |
| llmcompressor | 0.8.1 | A library for compressing large language models using the latest training-aware and post-training techniques. Designed to be flexible and easy to use on top of PyTorch and Hugging Face Transformers, allowing for quick experimentation. | 2025-10-08 02:12:55 |
| plankode | 0.0.0a0 | Unified token rendering library for DeepSeek models. | 2025-10-07 10:53:05 |
| scorio | 0.0.1 | Bayesian evaluation and ranking toolkit | 2025-10-06 23:28:40 |
| evalia | 0.0.1 | Bayesian evaluation and ranking toolkit | 2025-10-06 05:33:41 |
| applydir | 0.4.0 | Utility to apply changes to files based on AI-generated recommendations. | 2025-10-06 04:49:44 |
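
Several of the entries above (openai-http-proxy, oai-proxy, lm-proxy, llm-proxy-server, inference-proxy, ai-proxy-server) describe themselves as OpenAI-compatible HTTP proxy servers, which in practice means they can be called with the standard `openai` Python client by overriding its base URL. The following is a minimal sketch of that pattern; the proxy address, API key, and model name are placeholders, not values documented by any of the listed packages.

```python
from openai import OpenAI

# Point the standard OpenAI client at a locally running OpenAI-compatible proxy.
# base_url, api_key, and model are hypothetical; use whatever the proxy's own
# configuration actually exposes.
client = OpenAI(
    base_url="http://localhost:8000/v1",   # assumed local proxy endpoint
    api_key="local-proxy-key",             # many local proxies ignore or stub this
)

response = client.chat.completions.create(
    model="my-backend-model",  # whichever upstream model the proxy routes to
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```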